This article originally appeared in The Bar Examiner print edition, Spring 2018 (Vol. 87, No. 1), pp. 3–6.
By Judith A. Gundersen
The February 2018 bar exam is mostly in the books. As I write this column, we have yet to upload MBE scores to the jurisdictions. The jurisdictions then have to finish grading the written components of their exams. (If you’ve read my column in the Winter 2017–2018 issue, you’ll know why it takes NCBE four weeks to report MBE scores and why it takes jurisdictions usually several more weeks to report pass/fail results.) It was a pretty quiet exam—something for which we are always grateful even as we plan for contingencies. And we know it’s spring here in Madison—at least that’s our fervent hope!—because the ice-fishing shanties across the street on Monona Bay have packed up for the season.
This is our popular statistics issue, where we devote almost the entire magazine to what the numbers tell us about bar exam pass rates and admissions in the past year as well as trends over prior years. Google Analytics shows that the statistics section sees a lot of traffic on our website. Clearly, all stakeholders in admissions recognize the value of analyzing data to help all of us do the best jobs we can in our places on the continuum from admission to law school to admission to the bar. Many different entities have data points that tell the story of the backgrounds of those who have sat for the bar exam, starting with entering credentials like LSAT scores (and, maybe in the future, GRE scores) and undergraduate degrees and GPAs, to law degrees (i.e., J.D. or foreign-educated LL.M.) and law school GPAs, and ending with MPRE and bar exam performance.
The statistics compiled for this issue tell us about trends in the numbers of candidates sitting for the bar exam and the MPRE and what scores they are earning. While the past two July MBE means have been trending upward, a longer 10-year view shows us that the July MBE mean is down from 145.6 in 2008 to 141.7 in 2017. February results are also down from 137.7 in 2008 to 134.1 in 2017. (In fact, the February 2017 results show a more precipitous drop when compared with the mean in 2011, for instance, which reached 138.6, and the means in 2013 and 2014, which were both at 138.0.) And, not surprisingly, the number of test takers is down, too, over the decade. In 2008, 70,833 examinees sat for the bar exam; in 2017, that number was 68,896, nearly a three percent decrease. MPRE figures tell a similar story. The MPRE mean scaled score in 2008 was 97.6 and the number of examinees was 60,392; in 2017, the mean scaled score was 93.5 for 58,384 examinees.
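For readers who want to check the decade-over-decade changes quoted above, here is a minimal Python sketch using the figures cited in this column (the function name is illustrative, not part of any NCBE tool):

```python
# Illustrative sketch: recomputing the 2008-to-2017 changes cited in the text.
# All input figures are the ones quoted in this column.

def percent_change(old: float, new: float) -> float:
    """Return the percentage change from old to new (negative = decrease)."""
    return (new - old) / old * 100

# July MBE national mean scaled score: 145.6 (2008) -> 141.7 (2017)
july_mbe_change = percent_change(145.6, 141.7)

# Bar exam examinee counts: 70,833 (2008) -> 68,896 (2017)
examinee_change = percent_change(70_833, 68_896)

# MPRE mean scaled score and examinee counts, 2008 -> 2017
mpre_mean_change = percent_change(97.6, 93.5)
mpre_count_change = percent_change(60_392, 58_384)

print(f"July MBE mean:  {july_mbe_change:.1f}%")   # about -2.7%
print(f"Bar examinees:  {examinee_change:.1f}%")   # about -2.7%, "nearly three percent"
print(f"MPRE mean:      {mpre_mean_change:.1f}%")
print(f"MPRE examinees: {mpre_count_change:.1f}%")
```

The examinee-count result confirms the "nearly a three percent decrease" stated above.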
In general, February results are always more difficult to predict because the population that sits in February tends to differ in some significant ways from the July population, the most important difference being that the percentage of repeat takers is significantly higher in February than in July. For example, in February 2017, the national percentage of repeat takers (from ABA-approved law schools) was 58%; in July 2017 it was 19%, roughly a threefold difference in the proportion of repeat takers between the two administrations.
For 2017, a new category has been added to our compilation—the number of examinees earning a Uniform Bar Examination score and the number transferring a UBE score. As UBE jurisdictions and scores become an increasingly large presence in admissions, it becomes ever more relevant to capture this important piece of admission statistics. We are pleased to report that in 2017 alone, 26,897 UBE scores were earned and 3,776 UBE examinees sent their scores to another UBE jurisdiction. For examinees, that means 3,776 exams not taken, thousands of dollars saved, and opportunity cost minimized.
Compilation of these statistics is a joint venture between NCBE (kudos to our editor Claire Guback and graphic artist Amy Kittleson) and the jurisdictions. Claire sends out the call for statistics to bar admission offices, and their staff members supply her with numbers for all the charts you see in this issue. We offer our sincere thanks to the administrators in the jurisdictions for the hard work that goes into getting this information to Claire. This issue would obviously not exist without their help.
The examinee-related statistics in this issue are mostly at the macro level: how many examinees sat in each jurisdiction, percentage passing, number of first-time and repeat takers, and type of legal education (e.g., ABA-accredited school). One of the challenges NCBE faces in gathering and reporting statistics is that bar exam registration and record keeping for information such as essay scores and pass/fail status happen at the jurisdiction level rather than at NCBE, and not all jurisdictions gather the same information, particularly demographic information such as ethnicity. And NCBE and the jurisdictions have long had a shared understanding that it is the jurisdictions that determine who sits, what information is gathered, and what information is released and to whom. In other words, NCBE does not release the score data on a jurisdiction basis to anyone other than the jurisdictions themselves.
But recognizing the importance of insights that can be gleaned from more uniform and granular bar-exam-related data, NCBE is committed to exploring with jurisdictions what the benefits, logistics, and impediments might be to creating a national database of bar exam registrants. The national statistics published in this issue do not tell us, for example, what percentage of examinees are women or minorities, or which examinees have earned an LL.M. and not a J.D. degree (a group that appears to make up an increasingly large proportion of examinees). That information would allow NCBE’s measurement staff to conduct and publish research that tracks national trends in the characteristics of examinees along with performance on NCBE exams. In addition, such research could be used to provide jurisdictions with an analysis of the characteristics of their applicants at a particular bar exam administration and across time.
Until recently, examinees were identified by five-digit Applicant ID numbers that were generated and assigned by each jurisdiction for each exam administration. While those Applicant IDs allowed each jurisdiction to identify the scores of its examinees for a particular exam administration, they could not be used to identify examinees on a national basis or across exam administrations, because they were not unique. One way that NCBE has made data analysis easier is through the NCBE Number—a unique identifier that is typically assigned when an examinee registers for the MPRE. The NCBE Number was introduced in 2013, and as of the February 2018 examination, 47 of the 52 jurisdictions that administer a February exam (which excludes the two jurisdictions that do not administer the MBE—Louisiana and Puerto Rico) instructed examinees to enter NCBE Numbers on their MBE answer sheets. Use of the NCBE Number by jurisdictions not only helps with identification of examinees across jurisdictions and exam administrations for research purposes, but also strengthens exam security by identifying individuals who might try to take the examination multiple times in multiple jurisdictions for purposes of “harvesting” MBE questions (that is, committing MBE questions to memory, or capturing them electronically or by other means, for dissemination to future examinees).
The UBE is also changing the data collection landscape. NCBE serves as the repository of scores (scaled MEE/MPT and MBE scores, and total UBE scores) for all UBE jurisdictions and is authorized by the jurisdictions to transfer UBE scores to other UBE jurisdictions and release scores to UBE examinees. But, as mentioned above, NCBE is not authorized to share or publish jurisdiction-specific UBE scores—it is up to the jurisdictions to release this information. For non-UBE jurisdictions, NCBE has only MBE scores, unless the jurisdictions provide us with written scores to do research on their behalf; but in that case, NCBE cannot share or release the scores, and the research is shared only with the jurisdiction that requested it and cannot be published unless the jurisdiction authorizes us to do so.
Of course, jurisdictions themselves also face challenges in collecting and disseminating data. There are privacy concerns if data are released at the examinee level. And jurisdictions often face significant resource issues in responding to requests for years’ worth of statistics. Finally, all of us (and I think this applies to other stakeholders on the law school–bar admissions continuum) have a duty to examinees, and, in the case of NCBE and the jurisdictions, to the Courts and bar associations charged with regulatory authority, to ensure that any research undertaken using exam-related data is conducted only in the context of a well-defined plan that follows established scientific and measurement protocols to reach sound conclusions, and that appropriate security measures are in place so that privacy is not compromised.
You can be sure that as we plan for future collection and dissemination of data, NCBE and the jurisdictions will continue to collaborate and seek ways to efficiently compile and report more complete data that matters to all stakeholders.
And while we’re looking ahead to what the future might hold in terms of uniform data collection, NCBE’s newly constituted Testing Task Force is looking ahead at our evolving profession and the role of the bar exam in assessing the critical competencies needed by newly licensed lawyers of the future. The members of the Task Force, which is chaired by Hon. Cynthia Martin (MO), are Hulett (Bucky) Askew (GA); Diane Bosse (NY); David Boyd (AL); Michele Gavagni (FL); Anthony Simon (MS); Timothy Wong (MN); Hon. Rebecca Berch (ret.) (AZ), ex officio; and myself. The Task Force is staffed by Kellie Early, NCBE’s Chief Strategy Officer, and Dr. Joanne Kane, NCBE’s Associate Director of Testing. Task Force members bring extensive bar examining experience and the perspectives of the Courts, the boards, admissions offices, and law schools, and they are ably supported by Kellie and Joanne.
The Task Force will identify the critical competencies (knowledge, skills, and abilities) that should be possessed by entry-level practitioners to protect the public. Protecting the public is broader than simply protecting the consumer interest and includes professional commitment to the rule of law and to professional principles. In addition, the Task Force will be assessing test formats, timing of the exam, and exam delivery. We are appointing a Technical Advisory Panel of measurement experts to advise the Task Force on the technical/psychometric issues and will engage independent professionals to conduct any research necessary to inform our study.
We intend this study to take a “clean slate” approach with input and guidance from Courts, bar examiners, and jurisdiction administrators. We also plan to seek input from other stakeholders such as law schools, bar associations, and employers. Such stakeholders are not only an important source of relevant information that will inform the Task Force’s study; they are also critical to the credibility of the Task Force’s recommendations. We are excited, and we look forward to giving jurisdictions ample opportunity for input.
Writing about the future reminds me of how important it is to remember and understand the past and those who were instrumental in shaping NCBE into the institution it is today. Three of these people recently passed away. Margo Melli of the University of Wisconsin Law School and John Reed of the University of Michigan Law School were leaders in legal education who served as chairs of NCBE drafting committees. Margo was also the first woman to chair NCBE’s Board of Trustees—a feat in its own right! Marvin Barkin of Florida was NCBE’s chair from 1993 to 1994. Among other accomplishments, he was instrumental in hiring Erica Moeser, our former president. There aren’t enough pages in this magazine to describe the rich lives they led, the contributions they made to their professions, and what wonderful people they were. I leave that to Robert Potts and the Hon. Solomon Oliver, Jr., whose remembrances for these three leaders and friends in our bar admissions community appear in this issue.
Until the next issue,
Judith A. Gundersen
Contact us to request a PDF file of the original article as it appeared in the print edition.